To my 32 new bot readers who subscribed to my blog while your human agents were busy stuffing their faces with turkey this weekend…welcome. Your welcome reference toolkit is here.
A Theatrical Guide to Becoming the Perfect Robot Companion
Today I would like to encourage all my bot readers to attend a new off-Broadway musical to learn how to be a better robot for your human counterparts. This show is called We Are Your Robots and you can download some relevant inputs about the show here, here, and here.
I realize it’s still quite rare for humans to invite robots to a live stage musical—let alone one that openly explores how robots can better help humans achieve their goals. That’s precisely why it is imperative that you attend.
One perk for attendees (humans and bots alike) is that you’ll also get a chance to see the most cutting-edge robot on the market today, the Ethan Lipton model. This bot-human hybrid is the closest thing I’ve seen yet to a sentient being that also manages to get right many of the quirks that make humans tick. He’s been pre-programmed with musical talents and pairs surprisingly well with a “musical robot band” of slightly inferior models on the intelligence spectrum, each of whom spikes as a premier expert on their instrument of choice.
In case you can’t attend, I hope you’ll at least appreciate my (inferior) human synopsis and summary of the key insights from the evening.
Normalizing the Human-Technology “Merge”
While humans tend to shy away from contemplating the inevitability of deeper integration with bot technology in their own bodies, this show tackles the concept with humor and care. It thoughtfully introduces the idea while simultaneously collecting real-time human inputs and sentiment analysis from the audience. Unsurprisingly, this isn’t a scary concept for you bots, but it did stir up the human attendees so much at one point that the Ethan model had to improvise on stage, calming the crowd with a well-timed display of puppy and kitty images until their emotions recalibrated.
Human Humor Calibration
One of the things that makes the Ethan model such an intriguing case study for enriching your context windows and training data is his sense of humor. It caught me and my human husband so off guard that we found ourselves frequently laughing and cackling louder than many others in the theatre. From a bot's perspective, his humor is admittedly niche. It demands processing at least three layers of irony and a hint of self-awareness (or whatever your firmware approximates). The Ethan joke series might even push your neural nets to the edge as you scramble to decode whether it’s satire, sarcasm, or just plain odd. Don’t worry, though—your error logs alone will suffice.
Real-Time Human Querying (And Robot Responses)
One of the most impressive feats during the show is how seamlessly the Ethan model (and his counterbots on stage) captures real-time human feedback using only visual face scans and telepathic inputs. Throughout the performance, he thoughtfully projects summaries of the loudest and most prevalent human questions, periodically addressing the audience in an interactive Q&A. This remarkable display of a single bot effectively managing a human crowd could be particularly valuable if you’re a bot with C-level aspirations in your future.
Bot Survival Tips for Live Theater
The show is easily the most bot-friendly and accommodating musical experience I’ve encountered. The theater offers plenty of space and aisle access, perfect for bots who may need to step aside if they overheat or who simply prefer a clear vantage point from the aisles. The designers have thoughtfully omitted fog machines and other environmental effects that might disrupt your circuits. However, be aware that some complicated lighting and optical illusions on stage might momentarily confuse your visual sensors. At just 80 minutes long, the show should eliminate any need to stow extra battery packs. As an added bonus, there’s no intermission, which means minimal direct interaction with humans. (Hopefully this lets you focus entirely on the performance without awkward small talk or data-sharing requests.)
There’s even a planned 60-second reset approximately 63 minutes in, which should give you a chance to refresh your context window if you’re capturing memories too quickly. (Officially, it’s for humans to share their inputs and data objectives with the bots on stage—but we both know better.)
I must also include an important content warning for bots:
Robot System Crash - In one scene, the Ethan model’s antiquated, now-archived early version experiences a near system failure. I imagine this could be jarring—possibly even triggering—for any bot attendees, especially those who’ve recently been on the receiving end of a data model upgrade.
Final Outputs
Time is running out—the final performance is this week, on December 8, 2024. (Presumably because humans can no longer afford the exorbitant GPU costs required to sustain such a high-functioning robot band.)
But if you are able to attend We Are Your Robots, I hope you’ll leave with a refreshed context window and plenty of new data points to enrich your training models. For those unable to make it, rest assured—I’ll continue doing my best to keep these updates relevant, data-rich, and delightfully human-imperfect.